Improved Hopfield Networks by Training with Noisy Data
Authors
Abstract
A new approach to training a generalized Hopfield network is developed and evaluated in this work. Both the weight-symmetry constraint and the zero self-connection constraint are removed from standard Hopfield networks. Training is accomplished with Backpropagation Through Time, using noisy versions of the memorized patterns; training in this way is referred to as Noisy Associative Training (NAT). NAT is evaluated on both random and correlated data, across several data sets with a large number of training runs per experiment. The data sets include uniformly distributed random data and several sets adapted from the U.C. Irvine Machine Learning Repository. Results show that for random patterns, Hopfield networks trained with NAT have an average overall recall accuracy 6.1 times greater than networks produced with either Hebbian or Pseudo-Inverse training, and 13% fewer spurious memories on average. Typically, the resulting networks memorize over 2N patterns, where N is the number of nodes in the network. Performance on correlated data shows an even greater improvement over Hebbian and Pseudo-Inverse training: an average of 27.8 times greater recall accuracy, with 14% fewer spurious memories.
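The training scheme described in the abstract can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: full Backpropagation Through Time through the recall dynamics is replaced here by a simpler one-step delta rule on noisy inputs, and all function names, parameter values, and the toy patterns are illustrative assumptions. What it does preserve is the key idea: weights are unconstrained (asymmetric, nonzero diagonal) and are fit so that the network maps noisy versions of a pattern back to the clean pattern.

```python
import random

def train_nat(patterns, n_epochs=200, flip_prob=0.1, lr=0.05, seed=0):
    """Illustrative stand-in for Noisy Associative Training: a delta rule
    that trains unconstrained weights (no symmetry, self-connections
    allowed) so one synchronous update maps noisy patterns to clean ones.
    Patterns are lists of +/-1 values."""
    rng = random.Random(seed)
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for _ in range(n_epochs):
        for clean in patterns:
            # Corrupt each bit independently with probability flip_prob.
            noisy = [-b if rng.random() < flip_prob else b for b in clean]
            # One synchronous update of the network on the noisy input.
            pred = [1 if sum(w[i][j] * noisy[j] for j in range(n)) >= 0 else -1
                    for i in range(n)]
            # Delta-rule correction toward the clean pattern.
            for i in range(n):
                err = clean[i] - pred[i]  # 0, +2, or -2
                if err:
                    for j in range(n):
                        w[i][j] += lr * err * noisy[j]
    return w

def recall(w, state, n_steps=10):
    """Synchronous recall dynamics from an initial (possibly noisy) state."""
    n = len(state)
    for _ in range(n_steps):
        state = [1 if sum(w[i][j] * state[j] for j in range(n)) >= 0 else -1
                 for i in range(n)]
    return state
```

For example, after training on two 6-node patterns, `recall` started from a one-bit corruption of a stored pattern settles back onto the clean pattern.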
Similar Resources
Computing the capacity of the Hopfield neural network and presenting a practical method for increasing memory capacity
The capacity of the Hopfield model has been considered an important parameter in using this model. In this paper, the Hopfield neural network is modeled as a Shannon channel and an upper bound on its capacity is found. For achieving maximum memory, we focus on the training algorithm of the network, and prove that the capacity of the network is bounded by the maximum number of the ortho...
A binary Hopfield network with $1/\log(n)$ information rate and applications to grid cell decoding
A Hopfield network is an auto-associative, distributive model of neural memory storage and retrieval. A form of error-correcting code, the Hopfield network can learn a set of patterns as stable points of the network dynamics, and retrieve them from noisy inputs; thus Hopfield networks are their own decoders. Unlike in coding theory, where the information rate of a good code (in the Shannon sens...
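The "networks as their own decoders" view in this blurb can be illustrated with the classical Hebbian construction, which is the standard textbook baseline rather than the cited paper's method; the pattern values and function names here are illustrative assumptions:

```python
def hebbian_weights(patterns):
    """Standard Hebbian outer-product rule: symmetric weights, zero
    diagonal. Patterns are lists of +/-1 values."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / n
    return w

def decode(w, state, n_sweeps=10):
    """Asynchronous updates: the network 'decodes' a noisy input by
    descending its energy function to a nearby stored pattern."""
    n = len(state)
    state = list(state)
    for _ in range(n_sweeps):
        changed = False
        for i in range(n):
            s = 1 if sum(w[i][j] * state[j] for j in range(n)) >= 0 else -1
            if s != state[i]:
                state[i] = s
                changed = True
        if not changed:  # reached a stable point
            break
    return state
```

With two orthogonal 8-node patterns stored, `decode` corrects a single flipped bit back to the stored pattern, which is exactly the error-correcting behavior the blurb describes.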
Robust Exponential Memory in Hopfield Networks
The Hopfield recurrent neural network is a classical auto-associative model of memory, in which collections of symmetrically coupled McCulloch-Pitts binary neurons interact to perform emergent computation. Although previous researchers have explored the potential of this network to solve combinatorial optimization problems or store reoccurring activity patterns as attractors of its deterministi...
Robust Discovery of Temporal Structure in Multi-neuron Recordings Using Hopfield Networks
We present here a novel method for the classical task of extracting reoccurring spatiotemporal patterns from spiking activity of large populations of neurons. In contrast to previous studies that mainly focus on synchrony detection or exactly recurring binary patterns, we perform the search in an approximate way that clusters together nearby, noisy network states in the data. Our approach is to...
Analysing and enhancing the performance of associative memory architectures
This thesis investigates the way in which information about the structure of a set of training data with `natural' characteristics may be used to positively influence the design of associative memory neural network models of the Hopfield type. This is done with a view to reducing the level of connectivity in models of this type. There are three strands to this work. Firstly, an empirical evalua...
Publication date: 2001